Even though children can only start using social media platforms at age 13, nearly 40% of 8-12-year-olds in the US actively use social media, often by faking their age. So, lawmakers are getting increasingly active in ensuring minors are protected from inappropriate content and that parents stay in the loop.
One of the most recent moves is California’s new Digital Age Assurance Act (AB 1043), which was signed into law by California Governor Gavin Newsom in October 2025. This new law is undoubtedly a big step toward protecting minors online by requiring age verification for digital services and apps. Shielding children from harmful content, the law aims to ensure parental consent for younger users and minimize unnecessary data collection. It’s part of a broader trend across the US, with states like Florida, Utah, and Texas also introducing their own age verification rules.
In this article, we will examine what the California Age Verification Law is, and what it means for business practices.
Overview of the California Age Verification Law
The Digital Age Assurance Act introduces a device-based age verification system designed to create safer digital environments for anyone under 18. Enforced by the California Attorney General, the law authorizes civil penalties for non-compliance: up to $2,500 per affected child for negligent violations and $7,500 per affected child for intentional ones.
Taking effect on January 1, 2027, with a grace period for existing accounts until July 1, 2027, the law is designed to make it easier for parents to set up devices for their children and to ensure that apps and platforms know the age of their users without collecting sensitive personal data, while creating a safer online experience for minors.
The law casts a wide net over digital service providers that serve California users:
- Operating system providers, like Apple’s iOS, Google’s Android, and Microsoft Windows, must build age verification into device account setup.
- Covered application stores, like the Apple App Store, Google Play, and similar platforms that distribute apps.
- Application developers: any company that owns or maintains a software application (whether it’s a social media app, game, streaming service, etc.) accessible on those operating systems.
What does it mean to you? If you develop or distribute apps likely to be used by Californians, especially minors, the law probably affects you.
Note: California’s Digital Age Assurance Act does not cover telecom providers or purely physical products, and it won’t hold a company liable for the actions of independent users – for example, if a child uses a parent’s device without permission.
“This law is about protecting our kids online while ensuring that businesses can innovate and comply without sacrificing privacy or usability.” – California Governor Gavin Newsom.
What the AB 1043 Law Requires

First, let’s underscore that the new law’s focus isn’t just on blocking adult content, but rather on verifying age in a privacy-conscious way and adjusting online experiences to be age-appropriate.
Therefore, businesses must implement concrete age-checking measures while honoring people’s privacy. Here’s what companies need to know today:
Operating systems must ask for age
Whenever someone sets up a new device or user account on a phone, tablet, or computer, the OS will need to offer an easy interface asking for the primary user’s birth date or age. For example, when a parent gives their child a new tablet, the setup process might prompt: “What is the age of the primary user of this device?” This information is used to generate an “age bracket signal”. The standard age ranges are under 13, 13–15, 16–17, and 18+.
For those fearing they will need to upload documents just to use an iPhone or iPad: that is not the case. The operating system software will not collect anything beyond age info, as the law explicitly states that providers “need not collect additional information like photos of government IDs” to verify age.
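To make the bracket logic concrete, here is a minimal sketch of how an operating system might map a reported age to one of the four statutory ranges. The function name and bracket labels are illustrative assumptions; AB 1043 defines the ranges but does not prescribe any API or wire format.

```python
def age_bracket(age: int) -> str:
    """Map a reported age to one of AB 1043's four age ranges.

    Labels are illustrative; the law specifies the ranges
    (under 13, 13-15, 16-17, 18+) but not how they are encoded.
    """
    if age < 0:
        raise ValueError("age must be non-negative")
    if age < 13:
        return "under_13"
    if age <= 15:
        return "13_15"
    if age <= 17:
        return "16_17"
    return "18_plus"

# The OS would attach a signal like this to the device account:
print(age_bracket(12))  # under_13
print(age_bracket(16))  # 16_17
```

Note that only the coarse bracket, not the exact birth date, is what apps downstream would ever see.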
Apps must check the age signal
All app developers are required to request the user’s “age bracket signal” from the OS or app store when an app is first launched on a device. Developers must treat this signal as the primary indicator of age unless they have clear evidence to the contrary. Requesting the age bracket signal can be handled via a simple API call in the app’s code.
Once the app receives the signal, it is deemed to have “actual knowledge” of the user’s age. So, if an app gets a signal saying a user is 12, the company now officially knows it is dealing with a child and must treat the user’s data and experience with appropriate care. Moreover, under the law, developers can’t willfully ignore obvious signs that a user’s real age conflicts with the signal – meaning you can’t look the other way if you learn the age info is wrong.
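In practice, an app’s first-launch flow might look something like the sketch below. The `fetch_signal` callable is a hypothetical stand-in for whatever interface OS vendors eventually ship; the point is that the app requests the signal once, caches the result, and treats it as actual knowledge from then on.

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class AgeKnowledge:
    bracket: Optional[str]  # e.g. "under_13", "13_15", "16_17", "18_plus"
    is_minor: bool          # True for any bracket other than "18_plus"

def resolve_age_on_first_launch(
    fetch_signal: Callable[[], Optional[str]]
) -> AgeKnowledge:
    """Request the OS age bracket signal at first launch.

    `fetch_signal` stands in for the (not yet published) OS/app-store
    API. If no signal is available, the app should fall back to its
    own age-assurance flow rather than assume the user is an adult.
    """
    bracket = fetch_signal()
    if bracket is None:
        # No signal (e.g. an older device): age is unknown.
        return AgeKnowledge(bracket=None, is_minor=False)
    # Receiving the signal confers "actual knowledge" of the user's age.
    return AgeKnowledge(bracket=bracket, is_minor=(bracket != "18_plus"))

# Example: simulate an OS reporting a child's bracket.
knowledge = resolve_age_on_first_launch(lambda: "under_13")
print(knowledge.is_minor)  # True
```

The cached result should be re-checked if the app later learns the signal is wrong, per the “willful ignorance” clause above.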
“California’s children are growing up in an online world with no guardrails. AB 1043 creates a secure, privacy-first system for age verification on digital devices and applications, ensuring that businesses can offer safer experiences without over-collecting personal data.” – Senator Tom Umberg, Co-author of the Digital Age Assurance Act.
Minimize data collection and sharing
The law is very clear that, while it demands age verification, it doesn’t want companies over-collecting personal data in the process, and that all players must follow data minimization principles. The OS providers are instructed to share only the necessary info (the age bracket signal) and not to share it with any third party unless needed to comply with the law.
App developers are likewise forbidden from requesting more info than needed for age compliance or from passing age data to outside parties. This means you shouldn’t be asking kids for things like a full ID scan or a Social Security number just to confirm age, because that would violate the law. The idea is to verify age with minimal intrusion, using the built-in signals rather than each app creating its own data honeypot.
Design age-appropriate experiences
Companies working with California residents should adjust their service for younger users. For instance, if your app now knows a user is 12, you may need to disable targeted ads or certain contact features to comply with other laws.
In fact, once a developer knows a child’s age, additional legal duties can be triggered. For example, under the federal Children’s Online Privacy Protection Act (COPPA), a service must obtain verifiable parental consent before collecting personal information from kids under 13. Similarly, the CCPA and the California Age-Appropriate Design Code Act impose stricter data protections and limits on profiling minors once you know you’re dealing with one.
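Once the bracket is known, these downstream duties can be expressed as simple policy checks. The sketch below is illustrative only: it shows one way to gate targeted ads, public profiles, and parental consent on the signal, with the under-13 consent threshold drawn from COPPA and the minor restrictions from the state laws discussed here.

```python
def policy_for_bracket(bracket: str) -> dict:
    """Illustrative feature-gating policy keyed on the age bracket signal.

    Thresholds: COPPA requires verifiable parental consent for
    under-13s; targeted ads and public profiles are commonly
    restricted for all minors under California's design-code rules.
    """
    is_minor = bracket != "18_plus"
    return {
        "targeted_ads": not is_minor,
        "public_profile": not is_minor,
        "parental_consent_required": bracket == "under_13",
    }

print(policy_for_bracket("under_13"))
# {'targeted_ads': False, 'public_profile': False, 'parental_consent_required': True}
```

A real policy table would be reviewed by counsel per jurisdiction; the value of centralizing it like this is that each new state law becomes a data change, not a code rewrite.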
In a nutshell, the law forces businesses to proactively recognize underage users and adapt – not just give everyone the same experience and hope kids don’t sign up.
No discrimination or competitive abuse
OS providers and app stores must apply all the rules specified in the law evenly and fairly. For example, they can’t impose strict age checks on third-party apps but exempt their own apps, nor can they misuse age data to gain a competitive edge. To illustrate: Apple can’t make life hard for outside developers in the name of age verification, while giving its own apps a free pass or using the data to outdo rivals.
“Creating a statutory age assurance framework that balances privacy and usability will give parents greater peace of mind, build trust with children and families, and create consistency for businesses looking to innovate responsibly.” – Assembly Member and Co-author of AB 1043, Buffy Wicks.
Unlike some other regulations that simply throw up roadblocks, similar to the “adults only beyond this point” gates, the Digital Age Assurance Act leans toward privacy-by-design and more nuanced safeguards.
How AB 1043 Compares to Other State Laws
In the last three years, the US has been swept by a flurry of state-level age verification laws, each tackling the issue a bit differently. For example, the Texas and Utah laws outline different age verification requirements.
So, it’s essential for businesses operating nationwide to understand how these laws compare to California’s newest age verification legislation.
| State law | Key requirements | Example |
| --- | --- | --- |
| Florida – Online Protections for Minors Act (HB 3) | – Requires strict age verification (ID upload or third-party checks) for any site hosting content “harmful to minors” (pornography, violent material, etc.) – Bans social media accounts for children under 14. Requires parental consent for users aged 14–17. – Allows individual lawsuits and criminal penalties for willful violations; fines up to $50,000 per incident. – Parts of the law are currently paused by injunction due to First Amendment challenges. | If a 13-year-old in Florida tries to sign up for Instagram, the platform must deny the account. A 15-year-old can join only with verifiable parental consent. |
| Utah – Social Media Regulation Act (SB 152) | – Social media platforms must verify the age of every Utah resident. – Parental consent required for all minors under 18. – Imposes nighttime curfews (accounts disabled overnight). – Prohibits targeted ads and certain features for minors. – Extends age-ID requirements to adult content sites, similar to Texas and Louisiana. | A Utah teen’s TikTok account would automatically shut off between set nighttime hours unless a parent adjusts settings through a verified parent account. |
| Texas – H.B. 1181 (Adult Content) + App Store Accountability Act (SB 2420) | – HB 1181: Requires 18+ verification for adult-content websites (ID or government record). Upheld by the US Supreme Court as constitutional. – SB 2420 (effective Jan 1, 2026): App stores must verify user age and obtain parental consent for any minor before app downloads or in-app purchases. – Defines clear age brackets (under 13, 13–15, 16–17, 18+). – Requires parent-linked accounts and sharing age/consent data with developers. | A 16-year-old in Texas cannot download a new mobile game unless their parent’s account has granted permission. App stores act as the compliance gatekeepers. |
To sum up, California’s AB 1043 avoids hard age gates and instead uses age signals to estimate user age and apply privacy protections for minors. It’s a less intrusive approach that aligns with broader privacy trends, but it also creates unique challenges for businesses that operate across multiple states.
Compliance Challenges for Businesses
Implementing age verification mechanisms isn’t simple. Businesses must accurately determine user age, avoid collecting unnecessary data, and comply with multiple state laws simultaneously. For example, a platform serving users in California, Texas, and Utah must navigate different requirements for age verification, parental consent, and data protection.
Let’s highlight key hurdles and how companies can approach them:
- Determining user age accurately
Studies show that 86% of children under 13 have accounts on platforms that officially ban under-13 users, often created by simply entering a false age. So, just asking “what age are you?” is not enough. Businesses will need a multi-tiered approach: a default method that’s low-friction, like an automated estimation or a third-party age check, with a fallback, like manual ID verification, if needed.
The key is balancing accuracy with user experience. Too high a barrier, and you’ll drive away legitimate users, and also anger parents. Too low, and you’ll fall short of compliance. Testing different solutions and possibly offering users choices (“Verify your age by ID or by a quick face scan”) can improve acceptance. It’s also wise to keep records of whatever age verification you do, in case regulators ask for proof that you attempted to verify ages.
Newer solutions use AI-based facial analysis and can estimate age by scanning a user’s face via webcam or phone. This is quick and document-free, but users may feel uneasy being “scanned”, and the tech, while improving, isn’t foolproof across diverse populations.
- Avoiding data overcollection
One of the cornerstones of the new California law is data minimization, meaning you can’t store every user’s birth certificate. Instead, companies are required to design verification flows that extract the minimum info needed, like a yes/no or age range confirmation, and then immediately delete or anonymize any personal data used for verification.
Also, you must clearly inform young users (and their parents) why you’re asking for this info and that you won’t exploit it for other purposes. Not only is this often legally required, but it also builds user trust, especially today, when public concern over data privacy is at an all-time high. And when children’s data is involved, that anxiety runs even higher.
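A minimization-friendly flow can be sketched as: derive the bracket, then immediately discard the raw input. Everything below is illustrative; the design point is that only the coarse result, never the birth date itself, leaves the verification step.

```python
from datetime import date

def minimal_age_check(birth_date: date, today: date) -> str:
    """Derive an age bracket from a birth date, retaining only the result.

    The birth date is used transiently and never persisted; only the
    coarse bracket (the minimum needed for compliance) is returned.
    """
    # Compute age, subtracting one if this year's birthday hasn't passed.
    age = today.year - birth_date.year - (
        (today.month, today.day) < (birth_date.month, birth_date.day)
    )
    if age < 13:
        bracket = "under_13"
    elif age <= 15:
        bracket = "13_15"
    elif age <= 17:
        bracket = "16_17"
    else:
        bracket = "18_plus"
    # Only the bracket is returned; the caller never stores the birth date.
    return bracket

print(minimal_age_check(date(2015, 6, 1), date(2027, 1, 1)))  # under_13
```

The same shape applies to ID-scan or face-estimation flows: the document or image is processed, a yes/no or bracket is emitted, and the source material is deleted.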
- Abiding by multiple and often conflicting laws
The US is a minefield of age verification legislation, differing in almost every state. California says “verify age and adjust design”; Florida says “kick off the under-14s and get IDs for adult content”; Texas and Utah say “no minors without parental consent,” and so on. This can be really frustrating. Do you apply the strictest rule to everyone, or do you geofence different experiences by state?
That’s why many companies are considering broad measures like applying California’s privacy-oriented standards nationwide. Because even if not legally required elsewhere, it arguably provides a baseline of protection. However, you might still need some state-specific tweaks.
Therefore, it’s a good idea to designate a compliance officer or team to monitor new laws and update your policies. On top of state laws, watch out for any federal legislation, as there have been talks of national bills, like a proposed Protecting Kids on Social Media Act, which could set a nationwide age rule. So, your compliance strategy should be flexible and modular.
- Ensuring transparency and user trust
To avoid backlash, companies should be careful about how they ask users to prove their age – it’s all about how you communicate. Consider a friendly, child-appropriate explainer of why you’re asking users to verify their age. Also, make sure your privacy policy is up to date and easy to read regarding these changes. Next, think about user support, as some percentage of users will have trouble with the process. For example, a teen might not have immediate access to a parent for consent, or an older user might be confused by an online ID check.
Finally, be transparent about what happens after verification. If you determine a user is underage, be clear about how you’ll treat their data and what features may change for them. Maybe offer a dashboard or account setting where a user (or their parent) can see and confirm the age info on file, and learn how to correct it if there’s an error.
Solution: Privacy-preserving Age Verification Technologies
Preparing for California’s new law may be daunting. But the good news is that you don’t have to invent these solutions from scratch! In fact, a number of tech companies specialize in age verification and identity validation.
That’s why a lot of businesses are turning to third-party providers that offer ready-made, compliance-friendly tools. For example, some providers offer AI-driven age estimation via camera as a service, where the user’s image is analyzed to output an age range and then deleted. Others provide a digital identity token – the user goes through a one-time verification with the provider, say by scanning their ID or using a biometric check, and then receives a secure digital certificate confirming their age that can be presented to multiple sites (this is sometimes called a “reusable age credential”).
Such solutions can dramatically reduce how many times a person has to expose their personal data. It works as a digital “proof-of-age card” that only says “18+” or “age verified” without revealing everything on your driver’s license. At Ondato, we specialize in digital identity verification. Our privacy-centric age verification tools, such as document-free verification and reusable age tokens, help businesses comply with various laws without hiccups.
In addition, by integrating solutions like Ondato, you can outsource a lot of the heavy lifting – the provider will handle the sensitive data under strict compliance, and you just get a yes/no answer or an age bracket signal to act on.
A friendly suggestion: When choosing a solution, ensure it complies with relevant privacy standards, such as GDPR if you have international users, and that it offers robust security. After all, you’re trusting them with your users’ data.
Key Takeaways
Getting ready for the California Age Verification Law can be daunting, but with a solid plan, you’ll be just fine:
- Identify how many minors use your platform, what data you collect from them, and where gaps exist in your current age checks.
- Coordinate with OS providers on California’s new “age-signal system” or choose privacy-first partners that verify age with minimal data collection.
- Where required, especially for users under 13, create easy ways for parents to approve, revoke, or manage access.
- Turn off risky features like targeted ads or public profiles for minors and document these safeguards in your risk assessments.
- Make sure staff understand the new workflows, update your privacy policy, and communicate clearly with users about why verification matters.
- Treat this as an ongoing effort: laws will evolve, and compliance should, too. Regular audits and quick adjustments will keep you ahead of changes.
California’s new law is a forward-thinking step toward safer digital experiences. Businesses that act now, embedding age assurance, privacy, and transparency into their design, will lead in building a more trustworthy online world.